GAAF: Searching Activation Functions for Binary Neural Networks Through Genetic Algorithm

Authors

Abstract

Binary neural networks (BNNs) show promising utilization in cost- and power-restricted domains such as edge devices and mobile systems. This is due to their significantly lower computation and storage demand, but at the cost of degraded performance. To close the accuracy gap, in this paper we propose to add a complementary activation function (AF) ahead of the sign-based binarization, and rely on a genetic algorithm (GA) to automatically search for ideal AFs. These AFs can help extract extra information from the input data in the forward pass, while allowing improved gradient approximation in the backward pass. Fifteen novel AFs are identified through our GA-based search, and most of them improve performance (up to 2.54% on ImageNet) when tested on different datasets and network models. Interestingly, periodic functions are a key component of the discovered AFs, which rarely exist in human-designed AFs. Our method offers a novel approach for designing general and application-specific BNN architectures. GAAF will be released on GitHub.
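To illustrate the idea described in the abstract, here is a minimal sketch of a complementary AF placed ahead of sign-based binarization, with a straight-through-estimator (STE) style backward pass. The periodic AF form `x + a*sin(b*x)` and its coefficients are illustrative assumptions, not the paper's actual searched functions.

```python
import math

def complementary_af(x, a=1.0, b=1.0):
    """Hypothetical candidate AF with a periodic term, applied before
    binarization. The form and coefficients (a, b) stand in for what a
    GA-based search would discover; they are not from the paper."""
    return x + a * math.sin(b * x)

def binarize_forward(x):
    """Forward pass: complementary AF, then sign binarization.
    Returns the binary output and the pre-binarization value z,
    which the backward pass needs."""
    z = complementary_af(x)
    return (1.0 if z >= 0.0 else -1.0), z

def binarize_backward(z, grad_out):
    """Backward pass: a common STE variant for sign() -- the gradient
    passes through unchanged where the pre-binarization value lies in
    [-1, 1], and is clipped to zero outside that range."""
    return grad_out if abs(z) <= 1.0 else 0.0
```

In this sketch, the complementary AF shapes the pre-binarization distribution in the forward pass, while the STE clipping region determines which gradients survive the backward pass; the GA would evaluate candidate AFs by the accuracy of the resulting BNN.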


Similar resources

Dynamical Control by Recurrent Neural Networks through Genetic Algorithm

1. Introduction Dynamic control problems often require a set of control rules. For example, an inverted pendulum system requires two different control rules for swinging up and stabilizing a pendulum. Recurrent neural networks (RNNs) are potential candidates to serve as controllers for such complex tasks. RNNs memorize, recall and discriminate time-series information in a parallel way, e...


Stochastic Neural Networks with Monotonic Activation Functions

We propose a Laplace approximation that creates a stochastic unit from any smooth monotonic activation function, using only Gaussian noise. This paper investigates the application of this stochastic approximation in training a family of Restricted Boltzmann Machines (RBM) that are closely linked to Bregman divergences. This family, which we call exponential family RBM (Exp-RBM), is a subset of t...


Deep Neural Networks with Multistate Activation Functions

We propose multistate activation functions (MSAFs) for deep neural networks (DNNs). These MSAFs are new kinds of activation functions which are capable of representing more than two states, including the N-order MSAFs and the symmetrical MSAF. DNNs with these MSAFs can be trained via conventional Stochastic Gradient Descent (SGD) as well as mean-normalised SGD. We also discuss how these MSAFs p...


Hardness Optimization for Al6061-MWCNT Nanocomposite Prepared by Mechanical Alloying Using Artificial Neural Networks and Genetic Algorithm

Among artificial intelligence approaches, artificial neural networks (ANNs) and genetic algorithms (GA) are widely applied for modification of materials properties in engineering science and large-scale modeling. In this work an artificial neural network (ANN) and a genetic algorithm (GA) were applied to find the optimal conditions for achieving the maximum hardness of Al6061 reinforced by multiwall car...


A Hybrid Neural Networks-Coevolution Genetic Algorithm for Multi-Variable Robust Design Problems in Quality Engineering

In this study, a hybrid algorithm is presented to tackle a multi-variable robust design problem. The proposed algorithm comprises neural networks (NNs) and a co-evolution genetic algorithm (CGA), in which the neural networks are used as a function approximation tool to estimate a map between process variables. Furthermore, in order to make a robust optimization of the response variables, a co-evolution algor...



Journal

Journal title: Tsinghua Science & Technology

Year: 2023

ISSN: 1878-7606, 1007-0214

DOI: https://doi.org/10.26599/tst.2021.9010084